A Nonlinear GMRES Optimization Algorithm for Canonical Tensor Decomposition

Author

  • Hans De Sterck
Abstract

A new algorithm is presented for computing a canonical rank-R tensor approximation that has minimal distance to a given tensor in the Frobenius norm, where the canonical rank-R tensor consists of the sum of R rank-one tensors. Each iteration of the method consists of three steps. In the first step, a tentative new iterate is generated by a stand-alone one-step process, for which we use alternating least squares (ALS). In the second step, an accelerated iterate is generated by a nonlinear generalized minimal residual (GMRES) approach, recombining previous iterates in an optimal way, and essentially using the stand-alone one-step process as a preconditioner. In particular, we use the nonlinear extension of GMRES that was proposed by Washio and Oosterlee in [Electron. Trans. Numer. Anal., 15 (2003), pp. 165–185] for nonlinear partial differential equation problems, and which is itself related to other existing acceleration methods for nonlinear systems of equations. In the third step, a line search is performed for globalization. The resulting nonlinear GMRES (N-GMRES) optimization algorithm is applied to dense and sparse tensor decomposition test problems. The numerical tests show that ALS accelerated by N-GMRES may significantly outperform stand-alone ALS when highly accurate stationary points are desired for difficult problems. Further comparison tests show that N-GMRES is competitive with the well-known nonlinear conjugate gradient method for the test problems considered and outperforms it in many cases. The proposed N-GMRES optimization algorithm is based on general concepts and may be applied to other nonlinear optimization problems.
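To make the three-step structure concrete, the sketch below shows a minimal, generic version of the N-GMRES outer iteration in Python/NumPy. It is an illustration under stated assumptions, not the paper's implementation: the names f, grad, and precond_step (the stand-alone one-step process, e.g. one ALS sweep) are placeholders, the window size w and the backtracking line search are simplified, and the regularization and restarting strategies used in practice are omitted.

```python
import numpy as np

def ngmres(x0, f, grad, precond_step, w=10, max_iter=100, tol=1e-10):
    """Simplified N-GMRES sketch: accelerate a one-step process (e.g. an ALS
    sweep) by recombining up to w previous iterates in a GMRES-like
    least-squares step, followed by a backtracking line search."""
    xs = [x0]            # window of previous iterates
    gs = [grad(x0)]      # and their gradients
    x = x0
    for _ in range(max_iter):
        # Step I: tentative iterate from the preconditioner (one ALS sweep).
        x_bar = precond_step(x)
        g_bar = grad(x_bar)
        if np.linalg.norm(g_bar) < tol:
            return x_bar

        # Step II: accelerated iterate x_a = x_bar + sum_j beta_j (x_bar - x_j),
        # with beta minimizing ||g_bar + sum_j beta_j (g_bar - g_j)||_2
        # (a small dense least-squares problem).
        A = np.column_stack([g_bar - g_j for g_j in gs])
        beta, *_ = np.linalg.lstsq(A, -g_bar, rcond=None)
        x_a = x_bar + np.column_stack([x_bar - x_j for x_j in xs]) @ beta

        # Step III: backtracking line search from x_bar along x_a - x_bar
        # for globalization (simplified here to a plain decrease condition).
        d = x_a - x_bar
        t, x_new = 1.0, x_a
        while f(x_bar + t * d) > f(x_bar) and t > 1e-8:
            t *= 0.5
            x_new = x_bar + t * d

        # Slide the window of stored iterates and gradients.
        xs.append(x_new)
        gs.append(grad(x_new))
        if len(xs) > w:
            xs.pop(0)
            gs.pop(0)
        x = x_new
    return x
```

The least-squares problem in step II has only as many unknowns as the window length, so its cost is negligible next to the ALS sweep that produces the tentative iterate.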

Similar articles

Optimization-Based Algorithms for Tensor Decompositions: Canonical Polyadic Decomposition, Decomposition in Rank-(Lr, Lr, 1) Terms, and a New Generalization

The canonical polyadic and rank-(Lr , Lr , 1) block term decomposition (CPD and BTD, respectively) are two closely related tensor decompositions. The CPD and, recently, BTD are important tools in psychometrics, chemometrics, neuroscience, and signal processing. We present a decomposition that generalizes these two and develop algorithms for its computation. Among these algorithms are alternatin...

Steepest Descent Preconditioning for Nonlinear GMRES Optimization

Steepest descent preconditioning is considered for the recently proposed nonlinear generalized minimal residual (N-GMRES) optimization algorithm for unconstrained nonlinear optimization. Two steepest descent preconditioning variants are proposed. The first employs a line search, while the second employs a predefined small step. A simple global convergence proof is provided for the N-GMRES optimi...

A Scalable Optimization Approach for Fitting Canonical Tensor Decompositions

Tensor decompositions are higher-order analogues of matrix decompositions and have proven to be powerful tools for data analysis. In particular, we are interested in the canonical tensor decomposition, otherwise known as CANDECOMP/PARAFAC (CP), which expresses a tensor as the sum of component rank-one tensors and is used in a multitude of applications such as chemometrics, signal processing, ne...
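As a small illustration of the CP model just described (an assumed NumPy example, not code from the cited paper), a rank-R approximation of a three-way tensor is the sum of R rank-one terms, each the outer product of one column from each factor matrix:

```python
import numpy as np

# Hypothetical sizes and factor matrices for a rank-R CP model of an I x J x K tensor.
I, J, K, R = 4, 5, 6, 3
A = np.random.rand(I, R)   # factor matrix for mode 1
B = np.random.rand(J, R)   # factor matrix for mode 2
C = np.random.rand(K, R)   # factor matrix for mode 3

# T_hat[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r]
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
print(T_hat.shape)         # (4, 5, 6)
```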

A Nonlinearly Preconditioned Conjugate Gradient Algorithm for Rank-R Canonical Tensor Approximation

Alternating least squares (ALS) is often considered the workhorse algorithm for computing the rank-R canonical tensor approximation, but for certain problems its convergence can be very slow. The nonlinear conjugate gradient (NCG) method was recently proposed as an alternative to ALS, but the results indicated that NCG is usually not faster than ALS. To improve the convergence speed of NCG, we ...

Multi-preconditioned GMRES

Standard Krylov subspace methods only allow the user to choose a single preconditioner, although in many situations there may be a number of possibilities. Here we describe an extension of GMRES, multi-preconditioned GMRES, which allows the use of more than one preconditioner. We give some theoretical results, propose a practical algorithm, and present numerical results from problems in domain ...

Journal:
  • SIAM J. Scientific Computing

Volume: 34   Issue:

Pages:  -

Publication date: 2012